Application of Transformer optimized by pointer generator network and coverage loss in field of abstractive text summarization
LI Xiang, WANG Weibing, SHANG Xueda
Journal of Computer Applications, 2021, 41(6): 1647-1651. DOI: 10.11772/j.issn.1001-9081.2020091375
Abstract
For the application scenario of abstractive text summarization, a Transformer-based summarization model was proposed, in which a Pointer Generator network and a Coverage Loss were added to the Transformer model for optimization. First, the Transformer model was adopted as the basic structure, and its attention mechanism was used to better capture the semantic information of the context. Then, the Coverage Loss was introduced into the loss function of the model to penalize repeated attention to, and generation of, the same words, so as to address the problem that the attention mechanism in the Transformer model keeps generating the same word in abstractive tasks. Finally, the Pointer Generator network was added to the model, allowing the model to copy words from the source text as generated words and thereby solve the Out Of Vocabulary (OOV) problem. Experiments were conducted to examine whether the improved model reduced inaccurate expressions and eliminated the repeated generation of the same word. Compared with the original Transformer model, the improved model raised the ROUGE-1 score by 1.98 percentage points, the ROUGE-2 score by 0.95 percentage points, and the ROUGE-L score by 2.27 percentage points, and improved the readability and accuracy of the summarization results. Experimental results show that the Transformer can be applied to the field of abstractive text summarization after adding the Coverage Loss and the Pointer Generator network.
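The two additions described in the abstract can be illustrated with a minimal PyTorch-style sketch. It assumes the standard pointer-generator and coverage formulation (See et al., 2017): the coverage term is the element-wise minimum of the current attention distribution and the accumulated coverage vector, and the final output distribution mixes the vocabulary softmax with a copy distribution over source tokens. The function names, tensor shapes, and this exact formulation are illustrative assumptions, not the implementation used in the paper.

import torch

def coverage_loss(attention, coverage):
    # Coverage loss for one decoding step (See et al., 2017 style, assumed here):
    # penalizes attending again to source positions that already received attention.
    # attention: (batch, src_len) attention distribution at the current step
    # coverage:  (batch, src_len) sum of attention distributions from previous steps
    return torch.sum(torch.min(attention, coverage), dim=1)

def final_distribution(p_gen, vocab_dist, attention, src_ids, extended_vocab_size):
    # Pointer-generator mixture: blend the decoder's vocabulary distribution with a
    # copy distribution over source tokens, so OOV source words can still be emitted.
    # p_gen:      (batch, 1) generation probability
    # vocab_dist: (batch, vocab_size) softmax over the fixed vocabulary
    # attention:  (batch, src_len) attention over source positions
    # src_ids:    (batch, src_len) source token ids in the extended vocabulary
    extended = torch.zeros(vocab_dist.size(0), extended_vocab_size,
                           device=vocab_dist.device)
    extended[:, :vocab_dist.size(1)] = p_gen * vocab_dist
    # Add copy probabilities onto the positions of the corresponding source token ids.
    return extended.scatter_add(1, src_ids, (1.0 - p_gen) * attention)

In this sketch the coverage loss would be weighted and added to the negative log-likelihood at each decoding step, while final_distribution replaces the plain vocabulary softmax when computing the output token probabilities.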